Neural Abstractive Text Summarization with Sequence-to-Sequence Models
Authors
Abstract
In the past few years, neural abstractive text summarization with sequence-to-sequence (seq2seq) models has gained a lot of popularity. Many interesting techniques have been proposed to improve seq2seq models, making them capable of handling different challenges, such as saliency, fluency, and human readability, and of generating high-quality summaries. Generally speaking, most of these techniques differ in one of three categories: network structure, parameter inference, and decoding/generation. There are also other concerns, such as efficiency and parallelism for training a model. In this article, we provide a comprehensive literature survey on seq2seq models for abstractive text summarization from the viewpoint of network structures, training strategies, and summary generation algorithms. Several models were first proposed for language modeling and generation tasks, such as machine translation, and later applied to summarization. Hence, we also provide a brief review of these models. As part of this survey, we develop an open-source library, namely, the Neural Abstractive Text Summarizer (NATS) toolkit. An extensive set of experiments has been conducted on the widely used CNN/Daily Mail dataset to examine the effectiveness of several different neural network components. Finally, we benchmark two models implemented in NATS on two recently released datasets, Newsroom and Bytecup.
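To make the surveyed architecture concrete, here is a minimal sketch of an attention-based encoder-decoder summarizer, assuming PyTorch; the class name, the bilinear attention, and all hyperparameters are illustrative choices and are not taken from the NATS toolkit.

```python
# A minimal sketch of an attentional seq2seq summarizer (assumes PyTorch).
# Names and sizes are illustrative, not the NATS implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Seq2SeqSummarizer(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Bidirectional encoder reads the source article.
        self.encoder = nn.LSTM(emb_dim, hid_dim, batch_first=True,
                               bidirectional=True)
        # Unidirectional decoder cell generates the summary token by token.
        self.decoder = nn.LSTMCell(emb_dim, 2 * hid_dim)
        self.attn = nn.Linear(2 * hid_dim, 2 * hid_dim, bias=False)
        self.out = nn.Linear(4 * hid_dim, vocab_size)

    def forward(self, src_ids, tgt_ids):
        # Encode the source into contextual states: (B, S, 2H).
        enc_states, _ = self.encoder(self.embed(src_ids))
        B = src_ids.size(0)
        h = enc_states.new_zeros(B, enc_states.size(-1))
        c = enc_states.new_zeros(B, enc_states.size(-1))
        logits = []
        for t in range(tgt_ids.size(1)):
            # Teacher forcing: feed the gold previous token.
            h, c = self.decoder(self.embed(tgt_ids[:, t]), (h, c))
            # Bilinear attention over encoder states.
            scores = torch.bmm(enc_states, self.attn(h).unsqueeze(2))  # (B, S, 1)
            weights = F.softmax(scores, dim=1)
            context = (weights * enc_states).sum(dim=1)                # (B, 2H)
            logits.append(self.out(torch.cat([h, context], dim=-1)))
        return torch.stack(logits, dim=1)  # (B, T, V)
```

A toy training step under the same assumptions, shifting the target by one position for next-token prediction:

```python
model = Seq2SeqSummarizer(vocab_size=5000)
src = torch.randint(0, 5000, (2, 40))   # batch of 2 articles, 40 tokens each
tgt = torch.randint(0, 5000, (2, 10))   # 10-token reference summaries
loss = F.cross_entropy(model(src, tgt[:, :-1]).transpose(1, 2), tgt[:, 1:])
```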
Similar Resources
Abstractive Text Summarization using Sequence-to-sequence RNNs and Beyond
In this work, we model abstractive text summarization using Attentional Encoder-Decoder Recurrent Neural Networks, and show that they achieve state-of-the-art performance on two different corpora. We propose several novel models that address critical problems in summarization that are not adequately modeled by the basic architecture, such as modeling key-words, capturing the hierarchy of sentenc...
Neural Abstractive Text Summarization
Abstractive text summarization is a complex task whose goal is to generate a concise version of a text without necessarily reusing the sentences from the original source, but still preserving the meaning and the key contents. We address this issue by modeling the problem as sequence-to-sequence learning and exploiting Recurrent Neural Networks (RNNs). This work is a discussion about our ongoi...
Sequence-to-Sequence RNNs for Text Summarization
In this work, we cast text summarization as a sequence-to-sequence problem and apply the attentional encoder-decoder RNN that has been shown to be successful for Machine Translation (Bahdanau et al. (2014)). Our experiments show that the proposed architecture significantly outperforms the state-of-the-art model of Rush et al. (2015) on the Gigaword dataset without any additional tuning. We also...
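Summaries are typically generated from such attentional encoder-decoder models with beam search. Below is a minimal sketch of that decoding strategy; the step(seq) callback is a hypothetical stand-in for one decoder step returning log-probabilities over the vocabulary, not an API from any of the cited systems.

```python
# A minimal beam-search decoder sketch. `step` is a hypothetical callback:
# given a partial token sequence, it returns {token_id: log P(token | seq)}.
import math

def beam_search(step, bos_id, eos_id, beam_width=4, max_len=30):
    beams = [([bos_id], 0.0)]  # (token sequence, cumulative log-probability)
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            if seq[-1] == eos_id:       # finished hypotheses carry over unchanged
                candidates.append((seq, score))
                continue
            log_probs = step(seq)
            top = sorted(log_probs.items(), key=lambda kv: kv[1], reverse=True)
            for tok, lp in top[:beam_width]:
                candidates.append((seq + [tok], score + lp))
        # Keep only the `beam_width` highest-scoring partial summaries.
        beams = sorted(candidates, key=lambda b: b[1], reverse=True)[:beam_width]
        if all(seq[-1] == eos_id for seq, _ in beams):
            break
    return beams[0][0]

# Toy usage: a fake decoder step with a fixed next-token distribution.
toy_step = lambda seq: {1: math.log(0.6), 2: math.log(0.3), 0: math.log(0.1)}
print(beam_search(toy_step, bos_id=2, eos_id=0, beam_width=2, max_len=5))
```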
Improving Neural Abstractive Text Summarization with Prior Knowledge (Position Paper)
Abstractive text summarization is a complex task whose goal is to generate a concise version of a text without necessarily reusing the sentences from the original source, but still preserving the meaning and the key contents. In this position paper we address this issue by modeling the problem as sequence-to-sequence learning and exploiting Recurrent Neural Networks (RNN). Moreover, we discus...
Text Generation for Abstractive Summarization
We have begun work on a framework for abstractive summarization and decided to focus on a module for text generation. For TAC 2010, we thus move away from sentence extraction. Each sentence in the summary we generate is based on a document sentence but it usually contains a smaller amount of information and uses fewer words. The system uses the output of a syntactic parser for a sentence and th...
Journal
Journal title: ACM/IMS Transactions on Data Science
Year: 2021
ISSN: 2691-1922, 2577-3224
DOI: https://doi.org/10.1145/3419106